Phenology_vids¶
Purpose: Create time lapse videos of a region for any location¶
Steps:¶
- Link to DataCube
- Specify time frame, central location, buffer size
- Download Sentinel data
- Generate indices as needed.
- Resample weekly???
- run xr_animation and save plots
- option to load polygon data (e.g. cadastrals)???
- if loading cadastrals, show trace plots of indices within polygons???import matplotlib.pyplot as plt
In [1]:
%matplotlib inline
import os
import datacube
import numpy as np
import pandas as pd
import xarray as xr
import datetime as dt
import matplotlib.pyplot as plt
import sys
sys.path.insert(1, '../Tools/')
from dea_tools.temporal import xr_phenology, temporal_statistics
from dea_tools.datahandling import load_ard
from dea_tools.bandindices import calculate_indices
from dea_tools.plotting import display_map, rgb
from dea_tools.dask import create_local_dask_cluster
from dea_tools.plotting import display_map, rgb, xr_animation
import skimage
# Plot animation
from IPython.display import Image
from IPython.core.display import Video
import warnings
warnings.simplefilter(action='ignore', category=FutureWarning)
# Create local dask cluster to improve data load time
client = create_local_dask_cluster(return_client=True)
2024-07-18 16:58:22,389 - distributed.nanny.memory - WARNING - Ignoring provided memory limit 131415622144 due to system memory limit of 32.00 GiB
Client
Client-1e6bbd08-44d3-11ef-aeaa-000007aafe80
| Connection method: Cluster object | Cluster type: distributed.LocalCluster |
| Dashboard: /proxy/8787/status |
Cluster Info
LocalCluster
c5c93198
| Dashboard: /proxy/8787/status | Workers: 1 |
| Total threads: 7 | Total memory: 32.00 GiB |
| Status: running | Using processes: True |
Scheduler Info
Scheduler
Scheduler-301331e6-d4ee-4234-b16c-4da46ec8d636
| Comm: tcp://127.0.0.1:37955 | Workers: 1 |
| Dashboard: /proxy/8787/status | Total threads: 7 |
| Started: Just now | Total memory: 32.00 GiB |
Workers
Worker: 0
| Comm: tcp://127.0.0.1:43577 | Total threads: 7 |
| Dashboard: /proxy/45029/status | Memory: 32.00 GiB |
| Nanny: tcp://127.0.0.1:38609 | |
| Local directory: /jobfs/121050713.gadi-pbs/dask-scratch-space/worker-6_b4avf1 | |
In [2]:
dc = datacube.Datacube(app='Vegetation_phenology')
In [3]:
# Define area of interest
# # 186 Milgadara Rd, Barwang NSW: -34.38904277303204, 148.46949938279096
# Yelkin -33.47904684379098, 146.3094839864518
# Boomahnoomoona -36.11965805095775, 146.08472404116773
# Adam O'tool site: -33.5040228817206, 148.6385170105664
# Grant Sims multispecies cover crop experiment sites -36.22746736927963, 144.40088864017818
# Spring Valley farm: -35.284243390317, 149.0074353022316
#stub = 'ADAMO'
#stub = "SPRVAL"
stub = "MILG_24"
lat = -34.38904277303204
lon = 148.46949938279096
lon_buffer = 0.01
lat_buffer = 0.01
# Set the range of dates for the analysis
time_range = ('2024-01-01', '2024-08-01') # when is the earliest? 2016?
# Combine central lat,lon with buffer to get area of interest
lat_range = (lat-lat_buffer, lat+lat_buffer)
lon_range = (lon-lon_buffer, lon+lon_buffer)
display_map(x=lon_range, y=lat_range)
Out[3]:
Make this Notebook Trusted to load map: File -> Trust Notebook
Load sentinel data raw¶
In [4]:
# Create a reusable query
query = {
'y': lat_range,
'x': lon_range,
'time': time_range,
'measurements': ['nbart_red', 'nbart_green', 'nbart_blue', 'nbart_nir_1'],
'resolution': (-20, 20),
'output_crs': 'epsg:6933',
'group_by':'solar_day'
}
# Load available data from Sentinel-2
ds = load_ard(
dc=dc,
products=['ga_s2am_ard_3', 'ga_s2bm_ard_3'],
cloud_mask='s2cloudless',
min_gooddata=0.9,
**query,
)
# Shut down Dask client now that we have loaded the data we need
client.close()
# Preview data
ds
Finding datasets
ga_s2am_ard_3
ga_s2bm_ard_3
Counting good quality pixels for each time step using s2cloudless
/g/data/v10/public/modules/dea/20231204/lib/python3.10/site-packages/rasterio/warp.py:344: NotGeoreferencedWarning: Dataset has no geotransform, gcps, or rpcs. The identity matrix will be returned. _reproject( /g/data/v10/public/modules/dea/20231204/lib/python3.10/site-packages/rasterio/warp.py:344: NotGeoreferencedWarning: Dataset has no geotransform, gcps, or rpcs. The identity matrix will be returned. _reproject(
Filtering to 33 out of 89 time steps with at least 90.0% good quality pixels Applying s2cloudless pixel quality/cloud mask Loading 33 time steps
Out[4]:
<xarray.Dataset>
Dimensions: (time: 33, y: 106, x: 97)
Coordinates:
* time (time) datetime64[ns] 2023-01-09T00:06:28.779134 ... 2024-03...
* y (y) float64 -4.227e+06 -4.227e+06 ... -4.229e+06 -4.23e+06
* x (x) float64 1.438e+07 1.438e+07 ... 1.438e+07 1.438e+07
spatial_ref int32 6933
Data variables:
nbart_red (time, y, x) float32 616.0 706.0 939.0 ... 597.0 622.0 731.0
nbart_green (time, y, x) float32 496.0 627.0 745.0 ... 535.0 542.0 631.0
nbart_blue (time, y, x) float32 340.0 413.0 457.0 ... 401.0 414.0 459.0
nbart_nir_1 (time, y, x) float32 2.303e+03 2.616e+03 ... 1.838e+03 2e+03
Attributes:
crs: epsg:6933
grid_mapping: spatial_refIn [ ]:
In [5]:
# Calculate the chosen vegetation proxy index and add it to the loaded data set
ds = calculate_indices(ds, index=['NDVI','NDWI'], collection='ga_s2_3')
In [6]:
ds_weekly = ds.resample(time="1W").interpolate("linear")
In [7]:
path_animations = '/home/jovyan/Projects/PaddockTS/Results/'
num_frames = 20
num_frames = len(ds_weekly.time) # get total length from ds_weekly
In [10]:
num_frames
Out[10]:
64
In [11]:
# Create animation of weekly resampled NDVI
output = path_animations+stub+'_NDVI_weekly_nomarkup.mp4'
ds_ = ds_weekly.interpolate_na(dim = 'time', method = 'linear')
xr_animation(ds_,
bands=['NDVI'],
output_path = output,
#show_gdf = gdf_,
#gdf_kwargs={"edgecolor": 'blue'},
#gdf_kwargs={"edgecolor": pol['color_edge']}, # Make edge color something different
show_text="NDVI",
imshow_kwargs={"cmap": "Greens", "vmin": 0.0, "vmax": 1.0},
limit=num_frames)
plt.close()
Video(output, embed=True)
Exporting animation to /home/jovyan/Projects/PaddockTS/Results/SPRVAL_NDVI_weekly_nomarkup.mp4
0%| | 0/64 (0.0 seconds remaining at ? frames/s)
--------------------------------------------------------------------------- FileNotFoundError Traceback (most recent call last) Cell In[11], line 6 2 output = path_animations+stub+'_NDVI_weekly_nomarkup.mp4' 4 ds_ = ds_weekly.interpolate_na(dim = 'time', method = 'linear') ----> 6 xr_animation(ds_, 7 bands=['NDVI'], 8 output_path = output, 9 #show_gdf = gdf_, 10 #gdf_kwargs={"edgecolor": 'blue'}, 11 #gdf_kwargs={"edgecolor": pol['color_edge']}, # Make edge color something different 12 show_text="NDVI", 13 imshow_kwargs={"cmap": "Greens", "vmin": 0.0, "vmax": 1.0}, 14 limit=num_frames) 16 plt.close() 17 Video(output, embed=True) File /g/data/v10/public/modules/dea/20231204/lib/python3.10/site-packages/dea_tools/plotting.py:741, in xr_animation(ds, bands, output_path, width_pixels, interval, percentile_stretch, image_proc_funcs, show_gdf, show_date, show_text, show_colorbar, gdf_kwargs, annotation_kwargs, imshow_kwargs, colorbar_kwargs, limit) 739 anim.save(output_path, writer='pillow') 740 else: --> 741 anim.save(output_path, dpi=72) 743 # Update progress bar to fix progress bar moving past end 744 if progress_bar.n != len(ds.time): File /g/data/v10/public/modules/dea/20231204/lib/python3.10/site-packages/matplotlib/animation.py:1089, in Animation.save(self, filename, writer, fps, dpi, codec, bitrate, extra_args, metadata, extra_anim, savefig_kwargs, progress_callback) 1085 savefig_kwargs['transparent'] = False # just to be safe! 1086 # canvas._is_saving = True makes the draw_event animation-starting 1087 # callback a no-op; canvas.manager = None prevents resizing the GUI 1088 # widget (both are likewise done in savefig()). -> 1089 with writer.saving(self._fig, filename, dpi), \ 1090 cbook._setattr_cm(self._fig.canvas, _is_saving=True, manager=None): 1091 for anim in all_anim: 1092 anim._init_draw() # Clear the initial frame File /g/data/v10/public/modules/dea/20231204/lib/python3.10/contextlib.py:135, in _GeneratorContextManager.__enter__(self) 133 del self.args, self.kwds, self.func 134 try: --> 135 return next(self.gen) 136 except StopIteration: 137 raise RuntimeError("generator didn't yield") from None File /g/data/v10/public/modules/dea/20231204/lib/python3.10/site-packages/matplotlib/animation.py:240, in AbstractMovieWriter.saving(self, fig, outfile, dpi, *args, **kwargs) 235 _log.info("Disabling savefig.bbox = 'tight', as it may cause " 236 "frame size to vary, which is inappropriate for " 237 "animation.") 239 # This particular sequence is what contextlib.contextmanager wants --> 240 self.setup(fig, outfile, dpi, *args, **kwargs) 241 with mpl.rc_context({'savefig.bbox': None}): 242 try: File /g/data/v10/public/modules/dea/20231204/lib/python3.10/site-packages/matplotlib/animation.py:327, in MovieWriter.setup(self, fig, outfile, dpi) 325 def setup(self, fig, outfile, dpi=None): 326 # docstring inherited --> 327 super().setup(fig, outfile, dpi=dpi) 328 self._w, self._h = self._adjust_frame_size() 329 # Run here so that grab_frame() can write the data to a pipe. This 330 # eliminates the need for temp files. File /g/data/v10/public/modules/dea/20231204/lib/python3.10/site-packages/matplotlib/animation.py:195, in AbstractMovieWriter.setup(self, fig, outfile, dpi) 181 """ 182 Setup for writing the movie file. 183 (...) 192 in pixels of the resulting movie file. 193 """ 194 # Check that path is valid --> 195 Path(outfile).parent.resolve(strict=True) 196 self.outfile = outfile 197 self.fig = fig File /g/data/v10/public/modules/dea/20231204/lib/python3.10/pathlib.py:1077, in Path.resolve(self, strict) 1074 raise RuntimeError("Symlink loop from %r" % e.filename) 1076 try: -> 1077 s = self._accessor.realpath(self, strict=strict) 1078 except OSError as e: 1079 check_eloop(e) File /g/data/v10/public/modules/dea/20231204/lib/python3.10/posixpath.py:395, in realpath(filename, strict) 392 """Return the canonical path of the specified filename, eliminating any 393 symbolic links encountered in the path.""" 394 filename = os.fspath(filename) --> 395 path, ok = _joinrealpath(filename[:0], filename, strict, {}) 396 return abspath(path) File /g/data/v10/public/modules/dea/20231204/lib/python3.10/posixpath.py:430, in _joinrealpath(path, rest, strict, seen) 428 newpath = join(path, name) 429 try: --> 430 st = os.lstat(newpath) 431 except OSError: 432 if strict: FileNotFoundError: [Errno 2] No such file or directory: '/home/jovyan'
In [23]:
# Create animation of weekly resampled NDVI
output = path_animations+stub+'_NDVI_nomarkup.mp4'
ds_ = ds.interpolate_na(dim = 'time', method = 'linear')
xr_animation(ds_,
bands=['NDVI'],
#bands=['nbart_red', 'nbart_green', 'nbart_blue'],
output_path = output,
show_text="NDVI",
imshow_kwargs={"cmap": "Greens", "vmin": 0.0, "vmax": 1.0},
limit=num_frames)
plt.close()
Video(output, embed=True)
Exporting animation to /home/jovyan/Projects/PaddockTS/Results/SPRVAL_NDVI_nomarkup.mp4
0%| | 0/216 (0.0 seconds remaining at ? frames/s)
Out[23]:
In [24]:
# RGB actual time series
output = path_animations+stub+'_RGB_nomarkup.mp4'
ds_ = ds.interpolate_na(dim = 'time', method = 'linear')
xr_animation(ds_,
bands=['nbart_red', 'nbart_green', 'nbart_blue'],
output_path = output,
limit=num_frames)
plt.close()
Video(output, embed=True)
Exporting animation to /home/jovyan/Projects/PaddockTS/Results/SPRVAL_RGB_nomarkup.mp4
0%| | 0/216 (0.0 seconds remaining at ? frames/s)
Out[24]:
In [25]:
# RGB weekly resampled time series
output = path_animations+stub+'_RGB_weekly_nomarkup.mp4'
ds_ = ds_weekly.interpolate_na(dim = 'time', method = 'linear')
xr_animation(ds_,
bands=['nbart_red', 'nbart_green', 'nbart_blue'],
output_path = output,
limit=num_frames)
plt.close()
Video(output, embed=True)
Exporting animation to /home/jovyan/Projects/PaddockTS/Results/SPRVAL_RGB_weekly_nomarkup.mp4
0%| | 0/378 (0.0 seconds remaining at ? frames/s)
Out[25]:
In [ ]:
In [13]:
ds_weekly
Out[13]:
<xarray.Dataset>
Dimensions: (time: 368, y: 322, x: 290)
Coordinates:
* y (y) float64 -4.037e+06 -4.037e+06 ... -4.043e+06 -4.043e+06
* x (x) float64 1.434e+07 1.434e+07 ... 1.434e+07 1.434e+07
spatial_ref int32 6933
* time (time) datetime64[ns] 2017-03-12 2017-03-19 ... 2024-03-24
Data variables:
nbart_red (time, y, x) float64 2.1e+03 2.06e+03 1.997e+03 ... nan nan nan
nbart_green (time, y, x) float64 1.43e+03 1.393e+03 1.323e+03 ... nan nan
nbart_blue (time, y, x) float64 1.034e+03 973.2 939.5 ... nan nan nan
nbart_nir_1 (time, y, x) float64 3.251e+03 3.23e+03 3.18e+03 ... nan nan
NDVI (time, y, x) float64 0.2195 0.2235 0.2322 ... nan nan nan
NDWI (time, y, x) float64 -0.3911 -0.3983 -0.4132 ... nan nan nan
Attributes:
crs: epsg:6933
grid_mapping: spatial_refIn [ ]:
Load a different dataset (fractional cover)¶
https://knowledge.dea.ga.gov.au/notebooks/DEA_products/DEA_Fractional_Cover/
In [18]:
import datacube
from datacube.utils import masking
import matplotlib.pyplot as plt
import sys
sys.path.insert(1, '../Tools/')
from dea_tools.datahandling import wofs_fuser
from dea_tools.plotting import rgb, plot_wo
In [13]:
stub = "MILG_24"
lat = -34.38904
lon = 148.46949
lon_buffer = 0.01
lat_buffer = 0.01
# Set the range of dates for the analysis
time_range = ('2020-01-01', '2022-01-01')
# Combine central lat,lon with buffer to get area of interest
lat_range = (lat-lat_buffer, lat+lat_buffer)
lon_range = (lon-lon_buffer, lon+lon_buffer)
display_map(x=lon_range, y=lat_range)
Out[13]:
Make this Notebook Trusted to load map: File -> Trust Notebook
In [14]:
# Create a reusable query
query = {
'y': lat_range,
'x': lon_range,
'time': time_range,
'output_crs': 'epsg:6933',
'group_by':'solar_day'
}
# Load available data from
fc = dc.load(product='ga_ls_fc_3',
measurements=['bs', 'pv', 'npv', 'ue'],
resolution=(-30, 30),
**query)
# Shut down Dask client now that we have loaded the data we need
client.close()
# Preview data
fc
Out[14]:
<xarray.Dataset>
Dimensions: (time: 87, y: 71, x: 66)
Coordinates:
* time (time) datetime64[ns] 2020-01-07T23:56:40.592891 ... 2021-12...
* y (y) float64 -4.133e+06 -4.133e+06 ... -4.135e+06 -4.135e+06
* x (x) float64 1.432e+07 1.432e+07 ... 1.433e+07 1.433e+07
spatial_ref int32 6933
Data variables:
bs (time, y, x) uint8 18 19 19 19 17 17 18 ... 49 47 48 51 52 50
pv (time, y, x) uint8 13 13 13 13 13 13 14 ... 43 38 35 35 32 30
npv (time, y, x) uint8 68 67 66 67 68 69 67 ... 5 6 14 15 13 14 19
ue (time, y, x) uint8 7 7 7 7 7 7 7 7 7 7 ... 8 9 12 9 7 6 6 5 5 4
Attributes:
crs: epsg:6933
grid_mapping: spatial_refIn [19]:
# Replace all nodata values with `NaN`
fc = masking.mask_invalid_data(fc)
In [ ]:
In [20]:
# Plot DEA Fractional Cover data as a false colour RGB image
rgb(fc, bands=['bs', 'pv', 'npv'], col='time')
/g/data/v10/public/modules/dea/20231204/lib/python3.10/site-packages/matplotlib/cm.py:494: RuntimeWarning: invalid value encountered in cast xx = (xx * 255).astype(np.uint8)
In [24]:
# Plot unmixing error using `robust=True` to drop outliers and improve contrast
fc.ue.plot(col='time', robust=True, size=5)
Out[24]:
<xarray.plot.facetgrid.FacetGrid at 0x14c6b576c610>
In [25]:
# Replace all nodata values with `NaN`
fc = masking.mask_invalid_data(fc)
In [27]:
# Load DEA Water Observations data from the datacube
wo = dc.load(product='ga_ls_wo_3',
group_by='solar_day',
fuse_func=wofs_fuser,
like=fc)
# Plot the loaded water observations
plot_wo(wo.water, col='time', size=5)
Out[27]:
<xarray.plot.facetgrid.FacetGrid at 0x14c6b6658490>
In [29]:
# Keeping only dry, non-cloudy pixels
wo_mask = masking.make_mask(wo.water, dry=True)
wo_mask.plot(col='time', size=5)
Out[29]:
<xarray.plot.facetgrid.FacetGrid at 0x14c6adc8ba00>
In [30]:
# Set any unclear or wet pixel to `NaN`
fc_masked = fc.where(wo_mask)
# Plot the masked fractional cover data
rgb(fc_masked, bands=['bs', 'pv', 'npv'], col='time')
/g/data/v10/public/modules/dea/20231204/lib/python3.10/site-packages/matplotlib/cm.py:494: RuntimeWarning: invalid value encountered in cast xx = (xx * 255).astype(np.uint8)
In [31]:
# Calculate average fractional cover for `bs`, `pv` and `npv` over time
fc_through_time = fc_masked[['pv', 'npv', 'bs']].mean(dim=['x', 'y'])
# Plot the changing proportions as a line graph
fc_through_time.to_array().plot.line(hue='variable', size=5)
plt.title('Fractional cover over time')
plt.ylabel('Percent cover (%)');
In [ ]:
### make RGB video
stub = 'MILG_LS_BPN'
path_animations = '/g/data/xe2/John/Data/PadSeg/'
num_frames = len(ds.time) # get total length from ds_weekly
In [10]:
# RGB actual time series
output = path_animations+stub+'_test.mp4'
ds_ = ds.interpolate_na(dim = 'time', method = 'linear')
xr_animation(ds_,
bands=['bs', 'pv', 'npv'],
output_path = output,
limit=num_frames)
plt.close()
Video(output, embed=True)
Exporting animation to /g/data/xe2/John/Data/PadSeg/MILG_LS_BPN_test.mp4
0%| | 0/408 (0.0 seconds remaining at ? frames/s)
Out[10]:
In [11]:
--------------------------------------------------------------------------- NameError Traceback (most recent call last) Cell In[11], line 4 1 # Load DEA Water Observations data from the datacube 2 wo = dc.load(product='ga_ls_wo_3', 3 group_by='solar_day', ----> 4 fuse_func=wofs_fuser, 5 like=fc) 7 # Plot the loaded water observations 8 plot_wo(wo.water, col='time', size=5) NameError: name 'wofs_fuser' is not defined
In [ ]: